Algorithms and matching lower bounds for approximately-convex optimization

Authors

  • Andrej Risteski
  • Yuanzhi Li
Abstract

In recent years, a rapidly increasing number of applications in practice require optimizing non-convex objectives, such as training neural networks, learning graphical models, and maximum likelihood estimation. Although simple heuristics such as gradient descent with very few modifications tend to work well, theoretical understanding is very weak. We consider possibly the most natural class of non-convex functions for which one could hope to obtain provable guarantees: functions that are "approximately convex", i.e. functions f̃ : ℝ^d → ℝ for which there exists a convex function f : ℝ^d → ℝ such that for all x, |f̃(x) − f(x)| ≤ ∆ for a fixed value ∆. We then want to minimize f̃, i.e. output a point x̃ such that f̃(x̃) ≤ min_x f̃(x) + ε. It is quite natural to conjecture that for fixed ε, the problem gets harder as ∆ grows; however, the exact dependency between ε and ∆ is not known. In this paper, we significantly improve the known lower bound on ∆ as a function of ε, and give an algorithm matching this lower bound for a natural class of convex bodies. More precisely, we identify a function T : ℝ⁺ → ℝ⁺ such that when ∆ = O(T(ε)), we give an algorithm that outputs a point x̃ with f̃(x̃) ≤ min_x f̃(x) + ε in time poly(d, 1/ε). On the other hand, when ∆ = Ω(T(ε)), we prove an information-theoretic lower bound showing that any algorithm that outputs such an x̃ must use a super-polynomial number of evaluations of f̃.
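
The abstract does not spell out the algorithm (the full paper uses a more sophisticated sampling-based approach), so as a purely illustrative sketch of the problem setup, the following Python snippet minimizes an approximately convex f̃ using only function evaluations, via gradient descent on a randomized smoothing of f̃. All function names and parameter choices here are assumptions for the demo, not the authors' method: averaging f̃ over a random direction of radius mu washes out fluctuations of size ~∆, so a two-point gradient estimate of the smoothed function can still make progress when ∆ is small relative to ε.

```python
import numpy as np

def minimize_approx_convex(f_tilde, d, steps=3000, mu=0.1, lr=0.01, seed=0):
    """Illustrative zeroth-order sketch, NOT the paper's algorithm:
    minimize f_tilde, assumed pointwise Delta-close to an unknown
    convex f, using only evaluations of f_tilde."""
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    for _ in range(steps):
        u = rng.standard_normal(d)
        u /= np.linalg.norm(u)          # uniform random direction on the sphere
        # Two-point gradient estimator of the mu-smoothed function.
        g = (f_tilde(x + mu * u) - f_tilde(x - mu * u)) / (2 * mu) * d * u
        x -= lr * g
    return x

if __name__ == "__main__":
    d, Delta = 10, 0.01
    f = lambda x: np.dot(x - 1.0, x - 1.0)                        # convex base function
    f_tilde = lambda x: f(x) + Delta * np.sin(40.0 * np.sum(x))   # bounded perturbation
    x_hat = minimize_approx_convex(f_tilde, d)
    print(f_tilde(x_hat))   # close to the minimum, which is near 0
```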


Similar Articles

On Lower and Upper Bounds in Smooth Strongly Convex Optimization - A Unified Approach via Linear Iterative Methods

In this thesis we develop a novel framework to study smooth and strongly convex optimization algorithms, both deterministic and stochastic. Focusing on quadratic functions we are able to examine optimization algorithms as a recursive application of linear operators. This, in turn, reveals a powerful connection between a class of optimization algorithms and the analytic theory of polynomials whe...
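
To make the "recursive application of linear operators" viewpoint concrete: for a quadratic f(x) = ½xᵀAx − bᵀx, each gradient-descent step applies the fixed linear map I − ηA to the error, so the error after k steps is a degree-k polynomial in A applied to the initial error. A minimal numpy sketch of this connection (all names are illustrative):

```python
import numpy as np

# Gradient descent on f(x) = 0.5 x^T A x - b^T x obeys
#   x_{k+1} = x_k - eta (A x_k - b) = (I - eta A) x_k + eta b,
# so the error e_k = x_k - x* satisfies e_k = (I - eta A)^k e_0.
rng = np.random.default_rng(0)
d = 5
M = rng.standard_normal((d, d))
A = M @ M.T + np.eye(d)               # symmetric positive definite
b = rng.standard_normal(d)
x_star = np.linalg.solve(A, b)

eta = 1.0 / np.linalg.eigvalsh(A).max()
x = np.zeros(d)
for _ in range(50):
    x = x - eta * (A @ x - b)         # plain gradient descent

# Same answer via the linear-operator view applied to the initial error.
T = np.eye(d) - eta * A
e = np.linalg.matrix_power(T, 50) @ (-x_star)
print(np.allclose(x - x_star, e))     # True
```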


Statistical Query Algorithms for Mean Vector Estimation and Stochastic Convex Optimization

Stochastic convex optimization, where the objective is the expectation of a random convex function, is an important and widely used method with numerous applications in machine learning, statistics, operations research and other areas. We study the complexity of stochastic convex optimization given only statistical query (SQ) access to the objective function. We show that well-known and popular...
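
As a rough illustration of the statistical query model referenced here (the interface and names below are assumptions, not the paper's API): an SQ oracle answers expectations of queries over the data distribution up to a tolerance τ, and since the gradient of a stochastic convex objective is itself an expectation, gradient descent can be driven entirely through such queries.

```python
import numpy as np

class SQOracle:
    """Toy statistical-query oracle: for a query phi, return the mean
    E_w[phi(w)] over the data, perturbed by up to tau (illustrative only)."""
    def __init__(self, samples, tau, seed=0):
        self.samples = samples
        self.tau = tau
        self.rng = np.random.default_rng(seed)

    def query(self, phi):
        mean = np.mean([phi(w) for w in self.samples])
        return mean + self.rng.uniform(-self.tau, self.tau)

# Stochastic convex objective F(x) = E_w[(x - w)^2] in 1-D;
# its gradient E_w[2(x - w)] is itself a statistical query.
samples = np.random.default_rng(1).normal(loc=3.0, scale=1.0, size=10_000)
oracle = SQOracle(samples, tau=1e-3)

x = 0.0
for _ in range(200):
    grad = oracle.query(lambda w, x=x: 2.0 * (x - w))
    x -= 0.1 * grad
print(x)   # approaches E[w] = 3, the minimizer of F
```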


Statistical Query Algorithms for Stochastic Convex Optimization

Stochastic convex optimization, where the objective is the expectation of a random convex function, is an important and widely used method with numerous applications in machine learning, statistics, operations research and other areas. We study the complexity of stochastic convex optimization given only statistical query (SQ) access to the objective function. We show that well-known and popular...


Differentially Private Empirical Risk Minimization: Efficient Algorithms and Tight Error Bounds

In this paper, we initiate a systematic investigation of differentially private algorithms for convex empirical risk minimization. Various instantiations of this problem have been studied before. We provide new algorithms and matching lower bounds for private ERM assuming only that each data point’s contribution to the loss function is Lipschitz bounded and that the domain of optimization is bo...
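
For context on what a private ERM algorithm can look like, here is a hedged sketch of one classical recipe, output perturbation via the Gaussian mechanism, on a toy 1-D strongly convex objective. This is not necessarily the algorithm of the paper above, and all names and parameters are illustrative; the sketch assumes each per-point loss is L-Lipschitz on the domain and the regularized objective is λ-strongly convex, which bounds the minimizer's L2 sensitivity by 2L/(nλ).

```python
import numpy as np

def private_erm_output_perturbation(data, lam, L, eps, delta, seed=0):
    """Hedged sketch of output perturbation for (eps, delta)-DP ERM."""
    n = len(data)
    # Non-private minimizer of the toy strongly convex objective
    #   (1/n) sum_i 0.5*(x - w_i)^2 + 0.5*lam*x^2.
    x_hat = np.mean(data) / (1.0 + lam)
    sensitivity = 2.0 * L / (n * lam)
    sigma = sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps
    rng = np.random.default_rng(seed)
    return x_hat + rng.normal(0.0, sigma)   # Gaussian-mechanism noise

data = np.random.default_rng(2).normal(1.0, 1.0, size=5_000)
print(private_erm_output_perturbation(data, lam=0.1, L=1.0, eps=1.0, delta=1e-5))
```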


Communication Lower Bounds for Distributed Convex Optimization: Partition Data on Features

Recently, there has been an increasing interest in designing distributed convex optimization algorithms under the setting where the data matrix is partitioned on features. Algorithms under this setting sometimes have many advantages over those under the setting where data is partitioned on samples, especially when the number of features is huge. Therefore, it is important to understand the inhe...
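
To illustrate the feature-partitioned setting (an assumed toy setup, not the paper's protocol): each machine stores a column block of the data matrix, owns the corresponding coordinates of the iterate, and a full prediction requires one round of communication to sum the machines' partial products.

```python
import numpy as np

# Toy least-squares solve with the n x d data matrix split by columns
# (features) across machines; all names here are illustrative.
rng = np.random.default_rng(0)
n, d, machines = 100, 12, 3
A = rng.standard_normal((n, d))
y = rng.standard_normal(n)
blocks = np.array_split(np.arange(d), machines)   # one feature block per machine

x = np.zeros(d)
for _ in range(200):
    # One communication round: each machine sends its partial product
    # A[:, blk] @ x[blk]; summing them yields the full prediction.
    pred = sum(A[:, blk] @ x[blk] for blk in blocks)
    residual = pred - y
    # Each machine updates only its own coordinates from the shared residual.
    for blk in blocks:
        x[blk] -= 0.5 * (A[:, blk].T @ residual) / n
print(np.linalg.norm(A @ x - y))   # converges to the least-squares residual
```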



Publication date: 2016